
Proximal gradient method for huberized support vector machine



Abstract

The Support Vector Machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes the problem difficult to solve, in particular for regularized SVMs, such as with $\ell_1$-regularization. This paper considers the Huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the Proximal Gradient (PG) method for solving the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result about the support of the solution, based on which we further accelerate the algorithm by a two-stage method. We present extensive numerical experiments on both synthetic and real datasets, which demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
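
To illustrate the approach the abstract describes, below is a minimal NumPy sketch of the PG iteration for the $\ell_1$-regularized B-HSVM. It assumes the standard huberized hinge loss with smoothing parameter delta (a quadratic smoothing of the hinge kink) and a fixed step size from a simple Lipschitz bound; the names pg_bhsvm, lam, and delta are illustrative, and this is not the paper's exact two-stage algorithm.

    import numpy as np

    def huber_hinge_grad(t, delta):
        """Derivative of the huberized hinge loss
        phi(t) = 0                      for t > 1,
               = (1 - t)^2 / (2*delta)  for 1 - delta < t <= 1,
               = 1 - t - delta/2        for t <= 1 - delta,
        a differentiable approximation of the hinge loss max(0, 1 - t)."""
        g = np.zeros_like(t)
        mid = (t > 1 - delta) & (t <= 1)
        g[mid] = -(1.0 - t[mid]) / delta
        g[t <= 1 - delta] = -1.0
        return g

    def prox_l1(w, thresh):
        """Soft-thresholding: the proximal operator of thresh * ||w||_1."""
        return np.sign(w) * np.maximum(np.abs(w) - thresh, 0.0)

    def pg_bhsvm(X, y, lam=0.1, delta=0.5, iters=500):
        """Proximal gradient sketch for the l1-regularized B-HSVM:
        minimize (1/n) * sum_i phi(y_i * x_i @ w) + lam * ||w||_1."""
        n, d = X.shape
        # phi'' <= 1/delta, so ||X||_2^2 / (n*delta) bounds the Lipschitz
        # constant of the smooth part; use the corresponding fixed step.
        step = n * delta / np.linalg.norm(X, 2) ** 2
        w = np.zeros(d)
        for _ in range(iters):
            t = y * (X @ w)                                    # margins
            grad = X.T @ (y * huber_hinge_grad(t, delta)) / n  # smooth grad
            w = prox_l1(w - step * grad, step * lam)           # prox step
        return w

    if __name__ == "__main__":
        # Synthetic check: sparse ground truth, labels in {-1, +1}.
        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 50))
        w_true = np.zeros(50)
        w_true[:5] = 1.0
        y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
        w_hat = pg_bhsvm(X, y)
        print("nonzeros in solution:", np.count_nonzero(w_hat))

Because the loss is differentiable and the $\ell_1$-prox is the closed-form soft-thresholding, each iteration costs one gradient evaluation plus an elementwise shrinkage, which is what makes the PG approach attractive here.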
